Structured functional additive regression in reproducing kernel Hilbert spaces
Authors
Abstract
Similar resources
Structured functional additive regression in reproducing kernel Hilbert spaces.
Functional additive models (FAMs) provide a flexible yet simple framework for regressions involving functional predictors. The use of a data-driven basis in an additive rather than linear structure naturally extends the classical functional linear model. However, the critical issue of selecting nonlinear additive components has been less studied. In this work, we propose a new regularizat...
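The additive structure on a data-driven basis can be sketched as follows. This is a generic illustration only, assuming FPCA scores as the basis coefficients and a simple Nadaraya–Watson backfitting smoother; it is not the paper's RKHS-regularized estimator, and the data, bandwidth, and all names are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated functional predictors X_i(t) on a grid, plus scalar responses.
n, T = 200, 50
X = rng.normal(size=(n, T)).cumsum(axis=1) / np.sqrt(T)  # rough Brownian-like curves

# Step 1: data-driven basis via functional PCA (eigendecomposition of the
# empirical covariance of the centered curves).
Xc = X - X.mean(axis=0)
cov = Xc.T @ Xc / n
evals, evecs = np.linalg.eigh(cov)
phi = evecs[:, np.argsort(evals)[::-1][:3]]   # leading 3 eigenfunctions
scores = Xc @ phi / T                          # FPCA scores xi_{ij}
scores = (scores - scores.mean(0)) / scores.std(0)  # standardize for the demo

# True model: additive and nonlinear in the scores.
y = np.sin(3 * scores[:, 0]) + scores[:, 1] ** 2 + 0.1 * rng.normal(size=n)

# Step 2: fit each additive component f_j by backfitting with a Gaussian
# kernel smoother (a stand-in for a penalized RKHS fit).
def smooth(x, r, h=0.3):
    K = np.exp(-0.5 * ((x[:, None] - x[None, :]) / h) ** 2)
    return (K @ r) / K.sum(axis=1)

f = np.zeros((n, 3))
for _ in range(20):                 # backfitting sweeps
    for j in range(3):
        resid = y - y.mean() - f.sum(axis=1) + f[:, j]
        f[:, j] = smooth(scores[:, j], resid)
        f[:, j] -= f[:, j].mean()   # identifiability: centered components

yhat = y.mean() + f.sum(axis=1)
print("in-sample MSE:", round(float(np.mean((y - yhat) ** 2)), 3))
```

Each component f_j is re-fit against the partial residuals of the others until the fit stabilizes; centering each component keeps the additive decomposition identifiable.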
Quantile Regression in Reproducing Kernel Hilbert Spaces
In this paper we consider quantile regression in reproducing kernel Hilbert spaces, which we refer to as kernel quantile regression (KQR). We make three contributions: (1) we propose an efficient algorithm that computes the entire solution path of the KQR, with essentially the same computational cost as fitting one KQR model; (2) we derive a simple formula for the effective dimension of the KQR...
Real reproducing kernel Hilbert spaces
P(α) = α²F(x, x) + 2αF(x, y) + F(y, y), which is ≥ 0. In the case F(x, x) = 0, the fact that P ≥ 0 implies that F(x, y) = 0. In the case F(x, x) ≠ 0, P(α) is a quadratic polynomial and because P ≥ 0 it follows that the discriminant of P is ≤ 0: 4F(x, y)² − 4 · F(x, x) · F(y, y) ≤ 0. That is, F(x, y)² ≤ F(x, x)F(y, y), and this implies that F ...
Nonparametric Logistic Regression: Reproducing Kernel Hilbert Spaces and Strong Convexity
We study maximum penalized likelihood estimation for logistic regression type problems. The usual difficulties encountered when the log-odds ratios may become large in absolute value are circumvented by imposing a priori bounds on the estimator, depending on the sample size (n) and smoothing parameter. We pay for this in the convergence rate of the mean integrated squared error by a factor log n...
On-Line Regression Competitive with Reproducing Kernel Hilbert Spaces
We consider the problem of on-line prediction of real-valued labels, assumed bounded in absolute value by a known constant, of new objects from known labeled objects. The prediction algorithm’s performance is measured by the squared deviation of the predictions from the actual labels. No stochastic assumptions are made about the way the labels and objects are generated. Instead, we are given a ...
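The bounded-label setting can be illustrated with a simple online ridge regression whose predictions are clipped to the known bound. This generic sketch (simulated data, invented constants) is not the competitive algorithm the paper analyzes; it only shows the predict-then-update protocol with squared loss.

```python
import numpy as np

rng = np.random.default_rng(2)
Y = 1.0                          # known bound on the absolute value of labels
d, steps, lam = 3, 300, 1.0

w_true = np.array([0.5, -0.3, 0.2])
A = lam * np.eye(d)              # running ridge matrix: lam*I + sum x xᵀ
b = np.zeros(d)                  # running vector: sum x*y
total_sq_loss = 0.0

for _ in range(steps):
    x = rng.normal(size=d)
    y = float(np.clip(x @ w_true + 0.1 * rng.normal(), -Y, Y))  # bounded label
    w = np.linalg.solve(A, b)                # ridge estimate from past rounds
    pred = float(np.clip(x @ w, -Y, Y))      # clip: labels are known to be bounded
    total_sq_loss += (y - pred) ** 2         # squared deviation, as in the paper
    A += np.outer(x, x)                      # online update after seeing the label
    b += x * y

print("average squared loss:", round(total_sq_loss / steps, 3))
```

No stochastic assumption is needed for the protocol itself: the learner commits to a prediction before the label is revealed, and clipping can only reduce the squared loss when labels are known to lie in [-Y, Y].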
Journal
Journal title: Journal of the Royal Statistical Society: Series B (Statistical Methodology)
Year: 2013
ISSN: 1369-7412
DOI: 10.1111/rssb.12036